23 research outputs found

    Lepskii Principle in Supervised Learning

    In the setting of supervised learning with reproducing kernel methods, we propose a data-dependent rule for selecting the regularization parameter that is adaptive to the unknown regularity of the target function and is optimal both for the least-squares (prediction) error and for the reproducing kernel Hilbert space (reconstruction) norm error. It is based on a modified Lepskii balancing principle using a varying family of norms.
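The balancing idea can be illustrated with a short sketch. This is not the paper's modified rule (which balances over a varying family of norms); it is the classical Lepskii principle in a single norm, under the assumption that we are given a sequence of estimators with decreasing bias and a matching sequence of (increasing) stochastic-error bounds. The constant `C` and the inputs are illustrative.

```python
import numpy as np

def lepskii_select(estimates, noise_levels, C=4.0):
    """Classical Lepskii balancing principle (illustrative sketch).

    estimates[i]    : estimator for the i-th regularization level,
                      ordered so that bias decreases with i
    noise_levels[i] : bound on the stochastic error of estimates[i],
                      increasing with i
    Returns the largest index i whose estimate agrees with every
    coarser estimate up to C times its own noise level.
    """
    selected = 0
    for i in range(1, len(estimates)):
        # Balancing test: the finer estimate must stay within the
        # inflated noise band of all coarser (more biased) estimates.
        if all(np.linalg.norm(estimates[i] - estimates[j]) <= C * noise_levels[i]
               for j in range(i)):
            selected = i
        else:
            break
    return selected
```

The rule stops refining as soon as a new estimate leaves the noise band, which is where the (unknown) bias starts to dominate the (known) stochastic error.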

    Algorithm unfolding for block-sparse and MMV problems with reduced training overhead

    In this study, we consider algorithm unfolding for the multiple measurement vector (MMV) problem in the case where only a few training samples are available. Algorithm unfolding has been shown to empirically speed up the convergence of various classical iterative algorithms in a data-driven way, but for supervised learning it is important to achieve this with minimal training data. To this end, we consider the learned block iterative shrinkage thresholding algorithm (LBISTA) under different training strategies. To approach almost data-free optimization at minimal training overhead, the number of trainable parameters for algorithm unfolding has to be substantially reduced. We therefore propose a reduced-size network architecture based on the Kronecker structure imposed by the MMV observation model and present the corresponding theory in this context. To ensure proper generalization, we then extend the analytic weight approach by Liu and Chen to LBISTA and the MMV setting, and state rigorous theoretical guarantees and convergence results for this case. We show that the network weights can be computed by solving an explicit equation at the reduced MMV dimensions, which also admits a closed-form solution. Toward more practical problems, we then consider convolutional observation models and show that the proposed architecture and the analytical weight computation can be further simplified, opening new directions for convolutional neural networks. Finally, we evaluate the unfolded algorithms in numerical experiments and discuss connections to other sparse recovery algorithms.
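For orientation, the classical (non-learned) iteration that LBISTA unfolds can be sketched as follows. This is a minimal ISTA loop for the MMV problem with a row-sparsity (group soft-thresholding) penalty; in unfolded variants the fixed weight `W = A.T / L` and the threshold become trainable per-layer parameters. The problem sizes, `lam`, and iteration count below are illustrative, not from the paper.

```python
import numpy as np

def row_soft_threshold(X, tau):
    """Group soft-thresholding on the rows of X (promotes row sparsity)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return scale * X

def ista_mmv(A, Y, lam=0.1, n_iter=200):
    """Plain ISTA for the MMV problem
        min_X  0.5 * ||Y - A X||_F^2 + lam * sum_i ||X[i, :]||_2 .
    Unfolding replaces the fixed quantities below by learned ones.
    """
    L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the data-fit gradient
    W = A.T / L                     # fixed "analytic" weight; learned in LBISTA
    X = np.zeros((A.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        # gradient step on the data-fit term, then proximal (threshold) step
        X = row_soft_threshold(X + W @ (Y - A @ X), lam / L)
    return X
```

Each loop iteration corresponds to one layer of the unfolded network; truncating to a small, fixed number of layers and training the weights is what yields the data-driven speed-up discussed above.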

    The EnMAP imaging spectroscopy mission towards operations

    EnMAP (Environmental Mapping and Analysis Program) is a high-resolution imaging spectroscopy remote sensing mission that was successfully launched on April 1st, 2022. Equipped with a prism-based dual-spectrometer, EnMAP performs observations in the spectral range between 418.2 nm and 2445.5 nm with 224 bands and high radiometric and spectral accuracy and stability. EnMAP products, with a ground instantaneous field of view of 30 m × 30 m at a swath width of 30 km, allow for the qualitative and quantitative analysis of surface variables from frequently and consistently acquired observations on a global scale. This article presents the EnMAP mission and details the activities and results of the Launch and Early Orbit and Commissioning Phases until November 1st, 2022. The mission capabilities and expected performances for the operational Routine Phase are provided for existing and future EnMAP users.